Natural language grammatical inference: a comparison of recurrent neural networks and machine learning methods
Authors
Abstract
We consider the task of training a neural network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government and Binding theory. We investigate the following models: feed-forward neural networks, Frasconi-Gori-Soda and Back-Tsoi locally recurrent neural networks, Williams and Zipser and Elman recurrent neural networks, Euclidean and edit-distance nearest-neighbor classifiers, and decision trees. The non-neural-network machine learning methods are included primarily for comparison. We find that the Elman and Williams & Zipser recurrent neural networks are able to find a representation for the grammar which we believe is more parsimonious; these models exhibit the best performance.
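To make the Elman-style architecture concrete, here is a minimal sketch of an Elman recurrent network's forward pass for binary sentence classification (grammatical vs. ungrammatical). The vocabulary size, hidden size, one-hot encoding, and sigmoid readout are illustrative assumptions, not the paper's exact configuration, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

n_vocab, n_hidden = 8, 5                          # toy sizes (assumed, not from the paper)
W_xh = rng.normal(0, 0.1, (n_hidden, n_vocab))    # input -> hidden weights
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))   # hidden -> hidden recurrence (the Elman context loop)
W_hy = rng.normal(0, 0.1, (1, n_hidden))          # hidden -> output weights

def classify(sentence):
    """Run the Elman forward pass over a list of word indices and
    return a pseudo-probability that the sentence is grammatical,
    read off the final hidden state."""
    h = np.zeros(n_hidden)
    for w in sentence:
        x = np.zeros(n_vocab)
        x[w] = 1.0                        # one-hot encode the current word
        h = np.tanh(W_xh @ x + W_hh @ h)  # Elman update: h_t = tanh(W x_t + U h_{t-1})
    return 1.0 / (1.0 + np.exp(-(W_hy @ h)[0]))   # sigmoid readout

p = classify([1, 3, 2, 5])                # a toy "sentence" of word indices
```

In a real experiment the weights would be trained with backpropagation through time on labeled grammatical and ungrammatical sentences; the point here is only the recurrent state update that lets the network carry grammatical context across the sentence.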
Similar resources
Natural Language Grammatical Inference with Recurrent Neural Networks
This paper examines the inductive inference of a complex grammar with neural networks – specifically, the task considered is that of training a network to classify natural language sentences as grammatical or ungrammatical, thereby exhibiting the same kind of discriminatory power provided by the Principles and Parameters linguistic framework, or Government-and-Binding theory. Neural networks ar...
Full text
Learning a class of large finite state machines with a recurrent neural network
One of the issues in any learning model is how it scales with problem size. The problem of learning finite state machines (FSMs) from examples with recurrent neural networks has been extensively explored. However, these results are somewhat disappointing in the sense that the machines that can be learned are too small to be competitive with existing grammatical inference algorithms. We show t...
Full text
A Hybrid Connectionist-Symbolic Approach to Regular Grammatical Inference Based on Neural Learning and Hierarchical Clustering
Recently, recurrent neural networks (RNNs) have been used to infer regular grammars from positive and negative examples. Several clustering algorithms have been suggested to extract a finite state automaton (FSA) from the activation patterns of a trained net. However, the consistency with the examples of the extracted FSA is not guaranteed in these methods, and typically, some parameter of the...
Full text
Employing External Rich Knowledge for Machine Comprehension
The recently proposed machine comprehension (MC) task is an effort to deal with the natural language understanding problem. However, the small size of labeled machine comprehension data confines the application of deep neural network architectures that have shown an advantage in semantic inference tasks. Previous methods use many NLP tools to extract linguistic features but only gain little im...
Full text
Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks
This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than the classical offline grammatical inference with neural networks. The results obtained show that the performance of recurrent networks working online is acceptable when sequences come from finite-state machines or even fro...
Full text
Journal title:
Volume / Issue
Pages -
Publication date: 1995